Extrapolation estimates for entropy numbers
Authors
Abstract
Similar resources
Entropy Rate Estimates for Natural Language - A New Extrapolation of Compressed Large-Scale Corpora
One of the fundamental questions about human language is whether its entropy rate is positive. The entropy rate measures the average amount of information communicated per unit time. The question about the entropy of language dates back to experiments by Shannon in 1951, but in 1990 Hilberg raised doubt regarding a correct interpretation of these experiments. This article provides an in-depth e...
Estimates for Wieferich Numbers
We define Wieferich numbers to be those odd integers w ≥ 3 that satisfy the congruence 2^φ(w) ≡ 1 (mod w^2). It is clear that the distribution of Wieferich numbers is closely related to the distribution of Wieferich primes, and we give some quantitative forms of this statement. We establish several unconditional asymptotic results about Wieferich numbers; analogous results for the set of Wieferic...
Rényi Extrapolation of Shannon Entropy
Relations between Shannon entropy and Rényi entropies of integer order are discussed. For any N-point discrete probability distribution for which the Rényi entropies of order two and three are known, we provide a lower and an upper bound for the Shannon entropy. The average of both bounds provides an explicit extrapolation for this quantity. These results imply relations between the von Neumann...
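The quantities in this abstract are easy to illustrate. The paper's specific two-sided bounds are not reproduced here; this sketch only computes the Rényi entropies H_α = log(Σ p_i^α)/(1−α) for a sample distribution and checks the standard fact that H_α is non-increasing in α, so H_2 is already a lower bound on the Shannon entropy H_1.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in bits) of a discrete distribution p."""
    if alpha == 1:
        # The limit alpha -> 1 recovers the Shannon entropy.
        return -sum(x * math.log2(x) for x in p if x > 0)
    return math.log2(sum(x ** alpha for x in p)) / (1 - alpha)

# A sample 4-point distribution (illustrative only).
p = [0.5, 0.25, 0.15, 0.1]
H1 = renyi_entropy(p, 1)  # Shannon entropy
H2 = renyi_entropy(p, 2)  # collision entropy
H3 = renyi_entropy(p, 3)

# Rényi entropies are non-increasing in the order alpha,
# so H3 <= H2 <= H1 holds for any distribution.
assert H3 <= H2 <= H1
```

Any convex combination of a valid lower and upper bound, such as the average used in the paper, then yields an estimate of H_1 from H_2 and H_3 alone.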
Entropy Estimates for Dynamical Systems
We apply a method proposed recently for estimating entropies of symbol sequences to two ensembles of binary sequences obtained from dynamical systems. The first is the ensemble of (0,1)-sequences in a generating partition of the Hénon map; the second is the ensemble of spatial strings in cellular automaton 22 in the statistically stationary state. In both cases, the entrop...
Entropy Estimates for Generative Models
Different approaches to generative modeling entail different approaches to evaluation. While some models admit test likelihood estimation, for others only proxy metrics for visual quality are being reported. In this paper, we propose a simple method to compute differential entropy of an arbitrary decoder-based generative model. Using this approach, we found that models with qualitatively differ...
Journal
Journal title: Journal of Functional Analysis
Year: 2012
ISSN: 0022-1236
DOI: 10.1016/j.jfa.2012.09.016